A Comparison of Pruning Algorithms for Sparse Least Squares Support Vector Machines

Authors

  • Luc Hoegaerts
  • Johan A. K. Suykens
  • Joos Vandewalle
  • Bart De Moor
Abstract

Least Squares Support Vector Machines (LS-SVM) is a proven method for classification and function approximation. In comparison to the standard Support Vector Machine (SVM) it only requires solving a linear system, but it lacks sparseness in the number of solution terms, so pruning can be applied. Standard ways of pruning the LS-SVM consist of recursively solving the approximation problem and subsequently omitting the data points that had a small error in the previous pass; these schemes are based on the support values. We suggest a slightly adapted variant that improves the performance significantly. We assess the relative regression performance of these pruning schemes on independent test sets in benchmark experiments, comparing them with two subset selection schemes adapted for pruning (one based on the QR decomposition, which is supervised, and one that searches the most representative feature vector span, which is unsupervised), as well as with random omission and backward selection.
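Since the abstract only names the standard support-value pruning scheme, a minimal sketch of that loop may help. It assumes an RBF kernel, a regularization constant gamma, and a 5% drop fraction per pass; none of these choices come from the paper, and the function names (rbf_kernel, lssvm_train, prune_lssvm) are hypothetical.

import numpy as np

def rbf_kernel(X, Z, sigma=1.0):
    # Gaussian RBF kernel matrix between row sets X and Z (assumed kernel choice).
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    # Solve the LS-SVM dual as one linear (KKT) system:
    # [ 0   1^T            ] [b]     [0]
    # [ 1   K + I / gamma  ] [alpha] [y]
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]  # bias b, support values alpha

def prune_lssvm(X, y, n_keep, drop_frac=0.05, gamma=10.0, sigma=1.0):
    # Standard support-value pruning: retrain on the remaining points and
    # repeatedly omit the fraction with the smallest |alpha| until n_keep remain.
    idx = np.arange(len(y))
    while len(idx) > n_keep:
        b, alpha = lssvm_train(X[idx], y[idx], gamma, sigma)
        n_drop = min(max(1, int(drop_frac * len(idx))), len(idx) - n_keep)
        keep = np.argsort(np.abs(alpha))[n_drop:]  # keep the largest |alpha|
        idx = idx[keep]
    b, alpha = lssvm_train(X[idx], y[idx], gamma, sigma)
    return idx, b, alpha

The paper's adapted variant and its comparison baselines (QR-based and span-based subset selection, random omission, backward selection) are not reproduced in this sketch.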


Similar articles

Comments on "Pruning Error Minimization in Least Squares Support Vector Machines"

In this letter, we comment on "Pruning Error Minimization in Least Squares Support Vector Machines" by B. J. de Kruif and T. J. A. de Vries. The original paper proposes a way of pruning training examples for least squares support vector machines (LS-SVM) using no regularization (gamma = infinity). This causes a problem, as the derivation involves inverting a matrix that is often singular. We di...
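To make the singularity issue concrete: in the LS-SVM dual system the regularized kernel block is \(\Omega + \gamma^{-1} I\), with \(\Omega_{ij} = K(\mathbf{x}_i, \mathbf{x}_j)\). Taking gamma to infinity removes the ridge term (a sketch of the point in standard LS-SVM notation, not a quotation from the letter):

\[
\lim_{\gamma \to \infty}\left(\Omega + \gamma^{-1} I\right) = \Omega ,
\]

and \(\Omega\) is singular whenever the kernel matrix is rank-deficient (for example, with duplicated training points), so expressions that invert it are not always well defined.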


ECT and LS-SVM Based Void Fraction Measurement of Oil-Gas Two-Phase Flow

A method based on Electrical Capacitance Tomography (ECT) and an improved Least Squares Support Vector Machine (LS-SVM) is proposed for void fraction measurement of oil-gas two-phase flow. In the modeling stage, to address two problems of LS-SVM, pruning techniques are employed to make the LS-SVM sparse and robust; a Real-Coded Genetic Algorithm is then introduced to solve the difficult problem...


Pii: S0925-2312(01)00644-0

Least squares support vector machines (LS-SVM) is an SVM version which involves equality instead of inequality constraints and works with a least squares cost function. In this way, the solution follows from a linear Karush–Kuhn–Tucker system instead of a quadratic programming problem. However, sparseness is lost in the LS-SVM case and the estimation of the support values is only optimal in the...
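For reference, the linear Karush–Kuhn–Tucker system mentioned in that snippet takes the standard LS-SVM regression form (a sketch using the usual notation, not quoted from this page):

\[
\begin{bmatrix} 0 & \mathbf{1}^{\top} \\ \mathbf{1} & \Omega + \gamma^{-1} I \end{bmatrix}
\begin{bmatrix} b \\ \boldsymbol{\alpha} \end{bmatrix}
=
\begin{bmatrix} 0 \\ \mathbf{y} \end{bmatrix},
\qquad \Omega_{ij} = K(\mathbf{x}_i, \mathbf{x}_j),
\]

with the resulting model \(f(\mathbf{x}) = \sum_i \alpha_i K(\mathbf{x}, \mathbf{x}_i) + b\). Pruning amounts to removing terms from this kernel expansion, which is why the loss of sparseness matters.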


Selection methods for extended least squares support vector machines

Support vector machines (SVMs) have proven to be effective for solving learning problems and have been successfully applied to a large number of tasks. Lately a new technique, the Least Squares SVM (LS-SVM), has been introduced. This least squares version simplifies the required computation, but sparseness (a really attractive feature of the standard SVM) is lost. To reach a sparse model, furt...


A Robust LS-SVM Regression

In comparison to the original SVM, which involves a quadratic programming task, LS-SVM simplifies the required computation, but unfortunately the sparseness of the standard SVM is lost. Another problem is that LS-SVM is only optimal if the training samples are corrupted by Gaussian noise. In Least Squares SVM (LS-SVM), the nonlinear solution is obtained by first mapping the input vector to a high ...



Journal:

Volume:   Issue:

Pages:  -

Publication date: 2004